Web Survey Bibliography
In 2010, a task force commissioned by AAPOR produced a report that provided recommendations regarding how the survey research industry should approach the use of opt-in Internet panels. Their primary recommendation stated, "Researchers should avoid nonprobability online panels when one of the research objectives is to accurately estimate population values" (AAPOR 2010, 52). Yet, the AAPOR task force also noted that, "Despite the widespread use of online panels there is still a great deal that is not known with confidence" (54). Indeed, the mode study that arguably attracted the most attention in 2009 was conducted in 2004 (Yeager et al. 2009). Yet, studies conducted using data collected several years ago are unlikely to shed much light on the utility of opt-in Internet surveys today (or the increasing problems with RDD telephone polls). After all, Internet usage has increased significantly during the past several years, while the use of landline telephones has declined. More importantly, the methods used to recruit panelists and generate representative samples from opt-in panels are undergoing constant innovation.
In this paper, we present data from a four-mode study carried out in 2010. National surveys were fielded at the same time over the Internet (using an opt-in Internet panel), by telephone with live interviews (using a national RDD sample of landlines and cell phones), by telephone with IVR (landline only), and by mail (using a national sample of residential addresses). Each survey utilized a nearly identical questionnaire soliciting information across a range of political and social indicators, many of which can be validated with surveys fielded by other organizations at the same time. Comparing the findings from the modes to each other and the external indicators, we demonstrate that carefully executed opt-in Internet panels are an increasingly valid way to measure public opinion in the United States.
Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 66th Annual Conference, 2011 (26)
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Exploring Health-related Experiences and Access to Care: Differences between Online and Telephone Survey...; 2011; Doty, M. M., Peugh, J., Shand-Lubbers, J.
- Using Community Information and Survey Methodology for Bias Reduction to Enhance the Quality of the...; 2011; Harvey, J., Prabhakaran, J., Spera, C., Zhang, Zh.
- Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts; 2011; Toepoel, V.
- The Influence Of The Direction Of Likert-Type Scales In Web Surveys On Response Behavior In Different...; 2011; Keusch, F.
- An Injured Party?: A Comparison of Political Party Response Formats in Party Identification; 2011; Schwarz, S., Barlas, F. M., Thomas, R. K., Corso, R. A., Szoc, R.
- Asking Sensitive Questions: Do They Affect Participation In Follow-Up Surveys?; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats...; 2011; Dykema, J., Schaeffer, N. C., Beach, J., Lein, V., Day, B.
- Differential Sampling Based on Historical Individual-Level Data in Online Panels; 2011; Kelly, R. H.
- Web Survey Live Validations - What Are They Doing?; 2011; Crawford, S. D., McClain, C.
- Comparing Numeric and Text Open-End Responses in Mail and Web Surveys; 2011; Olson, K., Smyth, J.
- Effects of Response Formats when Measuring Attitudes in Consumer Web Surveys Across Markets; 2011; Couper, M. P., Nunge, E.
- Re-Examining the Validity of Different Survey Modes for Measuring Public Opinion in the U.S.: Findings...; 2011; Ansolabehere, S., Fraga, B., Schaffner, B. F.
- How to Survey All 14,000 Swedish Local Political Representatives And Get 10,000 Responses; 2011; Gilljam, M., Granberg, D., Holm, B., Persson, M.
- Measuring User Satisfaction in the Lab: Questionnaire Mode, Physical Location, and Social Presence Concerns...; 2011; Jans, M., Romano, J. C., Ashenfelter, K. T., Krosnick, J. A.
- Interactive Interventions in Web Surveys Can Increase Response Accuracy; 2011; Conrad, F. G.
- Impact on Data Quality of Making Incentives Salient in Web Survey Invitations; 2011; Zhang, Che.
- Effects of Mode and Incentives on Response Rates, Costs, and Response Quality in a Mixed Mode Survey...; 2011; Stevenson, J., Dykema, J., Kniss, C., Black, P., Moberg, P.
- Effects of Differential Incentives on Response Rates in Four Countries for a Web-based Follow Up Survey...; 2011; McSpurren, K.
- Completing Web Surveys on Cell-enabled iPads; 2011; Dayton, J., Driscoll, H.
- The Social Aspect of the Digital Divide; 2011; Johnson, E. P.
- Which Technologies Do Respondents Use in Online Surveys – An International Comparison?; 2011; Kaczmirek, L., Behr, D., Bandilla, W.
- Matrix Questionnaire Design to Reduce Measurement Error; 2011; Peytchev, A., Peytcheva, E.
- Race-of-Virtual-Interviewer Effects; 2011; Conrad, F. G., Schober, M. F., Nielsen, D.
- Which Web Survey Respondents Are Most Likely to Click for Clarification?; 2011; Coiner, T., Schober, M. F., Conrad, F. G.
- Providing Clarifying Instructions in a Web Survey; 2011; Redline, C. D.